New Results on Combination Methods for Boosting Ensembles
Authors
Abstract
The design of an ensemble of neural networks is a procedure that can be decomposed into two steps. The first one consists of generating the ensemble, i.e., training the networks so that they differ significantly from one another. The second one consists of properly combining the information provided by the networks. Adaptive Boosting (Adaboost), one of the best performing ensemble methods, has been studied and improved by several authors, including ourselves. Moreover, Adaboost and its variants use a specific combiner based on the error of the networks. Unfortunately, no in-depth study on combining this kind of ensemble has been carried out yet. In this paper, we study the performance of some important ensemble combiners on ensembles previously trained with Adaboost and Aveboost. The results show that an additional increase in performance can be obtained by applying the appropriate combiner.
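To make the comparison concrete, the sketch below (our illustration, not code from the paper) contrasts the usual Adaboost-style combiner, which weights each network's vote by log((1 - e_k) / e_k) of its training error e_k, with a plain output-averaging combiner of the kind an ensemble study might evaluate as an alternative; the function names and toy data are assumptions.

# Minimal sketch, assuming each network returns soft class scores and has a
# known weighted training error e_k; not the authors' implementation.
import numpy as np

def boosting_combiner(outputs, errors):
    """Adaboost-style weighted vote.

    outputs: array (n_nets, n_classes) with each network's soft output for one
    sample; errors: per-network weighted training error e_k in (0, 1).
    Each network votes for its predicted class with weight log((1 - e_k) / e_k)."""
    weights = np.log((1.0 - errors) / errors)   # classic Adaboost network weight
    votes = np.zeros(outputs.shape[1])
    for out, w in zip(outputs, weights):
        votes[np.argmax(out)] += w              # weighted hard vote
    return np.argmax(votes)

def average_combiner(outputs):
    """Simple (unweighted) average of the networks' soft outputs."""
    return np.argmax(outputs.mean(axis=0))

# Toy usage: three networks, four classes.
outputs = np.array([[0.7, 0.1, 0.1, 0.1],
                    [0.2, 0.5, 0.2, 0.1],
                    [0.6, 0.2, 0.1, 0.1]])
errors = np.array([0.10, 0.25, 0.15])
print(boosting_combiner(outputs, errors), average_combiner(outputs))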
Similar Articles
Decision Fusion on Boosting Ensembles
Training an ensemble of neural networks is an interesting way to build a Multi-net System. One of the key factors in designing an ensemble is how to combine the networks to give a single output. Although there are several important methods for building ensembles, Boosting is one of the most important ones. Most methods based on Boosting use a specific combiner (the Boosting Combiner). Although the Boosti...
An experimental study on diversity for bagging and boosting with linear classifiers
In classifier combination, it is believed that diverse ensembles have a better potential for improvement in accuracy than nondiverse ensembles. We put this hypothesis to the test for two methods of building ensembles, Bagging and Boosting, with two linear classifier models: the nearest mean classifier and the pseudo-Fisher linear discriminant classifier. To estimate diversity, we apply n...
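As an illustration of what estimating diversity can mean here, the following sketch computes the pairwise disagreement measure, one common choice; the cited study may rely on this or on other measures such as the Q statistic, so treat the function as an assumption rather than the authors' exact procedure.

# Minimal sketch of the pairwise disagreement measure over hard predictions.
import numpy as np
from itertools import combinations

def disagreement(preds):
    """preds: array (n_classifiers, n_samples) of predicted class labels.

    Returns the fraction of samples on which a pair of classifiers disagree,
    averaged over all pairs (0 = identical ensemble, higher = more diverse)."""
    pairs = combinations(range(preds.shape[0]), 2)
    return np.mean([np.mean(preds[i] != preds[j]) for i, j in pairs])

# Toy usage: three classifiers labelling five samples.
preds = np.array([[0, 1, 1, 0, 2],
                  [0, 1, 2, 0, 2],
                  [1, 1, 1, 0, 0]])
print(disagreement(preds))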
‘Fuzzy’ vs ‘Non-fuzzy’ in Combining Classifiers Designed by Boosting
Boosting is recognized as one of the most successful techniques for generating classifier ensembles. Typically, the classifier outputs are combined by weighted majority vote. The purpose of this study is to demonstrate the advantages of some fuzzy combination methods for ensembles of classifiers designed by Boosting. We ran 2-fold cross-validation experiments on 6 benchmark data sets to com...
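For orientation, the sketch below shows a few simple fuzzy aggregation rules (average, minimum and maximum of the classifiers' membership degrees) as alternatives to the crisp weighted majority vote; it is an illustrative assumption, not the combination scheme evaluated in the study.

# Minimal sketch of fuzzy aggregation over soft classifier outputs.
import numpy as np

def fuzzy_combine(soft_outputs, rule="average"):
    """soft_outputs: array (n_classifiers, n_classes) of membership degrees in [0, 1].

    Aggregates per-class support with a fuzzy connective and returns the class
    with the largest aggregated support."""
    if rule == "average":
        support = soft_outputs.mean(axis=0)
    elif rule == "minimum":
        support = soft_outputs.min(axis=0)
    elif rule == "maximum":
        support = soft_outputs.max(axis=0)
    else:
        raise ValueError(f"unknown rule: {rule}")
    return np.argmax(support)

# Toy usage: three classifiers, three classes, three aggregation rules.
soft = np.array([[0.6, 0.3, 0.1],
                 [0.4, 0.4, 0.2],
                 [0.7, 0.2, 0.1]])
print([fuzzy_combine(soft, r) for r in ("average", "minimum", "maximum")])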
A New Classifiers Ensemble Method for Handwritten Pen Digits Classification
Recent research has shown that ensembles of classifiers are more accurate than a single classifier. Bagging, boosting and error correcting output codes (ECOC) are the most common ways of creating combinations of classifiers. In this paper a new method for building an ensemble of classifiers is introduced, and its performance is examined by applying it to a handwritten pen digits dataset. Experimenta...
A comparative study of classifier ensembles for bankruptcy prediction
The aim of bankruptcy prediction in the areas of data mining and machine learning is to develop an effective model that provides high prediction accuracy. In the prior literature, various classification techniques have been developed and studied, among which classifier ensembles, which combine multiple classifiers, have been shown to outperform many single classifiers. ...
Popular Ensemble Methods: An Empirical Study
An ensemble consists of a set of individually trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman, 1996c) and Boosting (Freund & Schapire, 1996; Schapire, 1990) are two relatively new...